Learnability of Gaussians with Flexible Variances

Authors

  • Yiming Ying
  • Ding-Xuan Zhou
Abstract

Gaussian kernels with flexible variances provide a rich family of Mercer kernels for learning algorithms. We show that the union of the unit balls of reproducing kernel Hilbert spaces generated by Gaussian kernels with flexible variances is a uniform Glivenko-Cantelli (uGC) class. This result confirms a conjecture concerning learnability of Gaussian kernels and verifies the uniform convergence of many learning algorithms involving Gaussians with changing variances. Rademacher averages and empirical covering numbers are used to estimate sample errors of multi-kernel regularization schemes associated with general loss functions. It is then shown that the regularization error associated with the least square loss and the Gaussian kernels can be greatly improved when flexible variances are allowed. Finally, for regularization schemes generated by Gaussian kernels with flexible variances we present explicit learning rates for regression with least square loss and classification with hinge loss.
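The regularization schemes the abstract refers to (least-square loss over an RKHS generated by a Gaussian kernel, with the variance allowed to vary) can be illustrated with a minimal numerical sketch. This is not the authors' algorithm: the function names, the variance grid, and the validation-based selection of the variance are illustrative assumptions; the paper's analysis concerns learning rates and uGC properties, not a specific implementation.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    """Gram matrix K[i, j] = exp(-||x_i - y_j||^2 / sigma^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / sigma**2)

def krr_fit(X, y, sigma, lam):
    """Least-square regularization in the RKHS of K_sigma:
    minimize (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2,
    whose minimizer is f = sum_i alpha_i K_sigma(x_i, .)."""
    m = len(X)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * m * np.eye(m), y)

def predict(X_train, alpha, sigma, X_test):
    # Evaluate the kernel expansion at the test points.
    return gaussian_kernel(X_test, X_train, sigma) @ alpha

# "Flexible variances": instead of one fixed sigma, search a grid of
# variances and keep the one with the smallest validation error.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(80, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(80)
Xtr, ytr, Xva, yva = X[:60], y[:60], X[60:], y[60:]

best = min(
    (np.mean((predict(Xtr, krr_fit(Xtr, ytr, s, 1e-3), s, Xva) - yva) ** 2), s)
    for s in [0.05, 0.2, 0.5, 1.0, 2.0]
)
print(best)  # (validation MSE, selected sigma)
```

Searching over variances is one concrete way to exploit the multi-kernel setting; the paper's point is that the union of the corresponding RKHS unit balls remains a uGC class, so such schemes still enjoy uniform convergence guarantees.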


Similar articles

When Does Determinacy Imply Expectational Stability?

We study the connections between determinacy of rational expectations equilibrium, and expectational stability or learnability of that equilibrium, in a relatively general New Keynesian model. Adoption of policies that induce both determinacy and learnability of equilibrium has been considered fundamental to successful policy in the literature. We ask what types of economic assumptions drive di...


Product of Gaussians as a distributed representation for speech recognition

Distributed representations allow the effective number of Gaussian components in a mixture model, or state of an HMM, to be increased without dramatically increasing the number of model parameters. Various forms of distributed representation have previously been investigated. In this work it is shown that the product of experts (PoE) framework may be viewed as a distributed representation when the...


Variance adaptive shrinkage (vash): flexible empirical Bayes estimation of variances

MOTIVATION Genomic studies often involve estimation of variances of thousands of genes (or other genomic units) from just a few measurements on each. For example, variance estimation is an important step in gene expression analyses aimed at identifying differentially expressed genes. A common approach to this problem is to use an Empirical Bayes (EB) method that assumes the variances among gene...


An efficient approximate method for solution of the heat equation using Laguerre-Gaussians radial functions

In the present paper, a numerical method is considered for solving one-dimensional heat equation subject to both Neumann and Dirichlet initial boundary conditions. This method is a combination of collocation method and radial basis functions (RBFs). The operational matrix of derivative for Laguerre-Gaussians (LG) radial basis functions is used to reduce the problem to a set of algebraic equatio...


Robot Docking Using Mixtures of Gaussians

This paper applies the Mixture of Gaussians probabilistic model, combined with Expectation Maximization optimization to the task of summarizing three dimensional range data for a mobile robot. This provides a flexible way of dealing with uncertainties in sensor information, and allows the introduction of prior knowledge into low-level perception modules. Problems with the basic approach were so...



Journal:
  • Journal of Machine Learning Research

Volume 8, Issue –

Pages –

Publication date: 2007